A novel weight initialization method for the random neural network

Author

  • Stelios Timotheou
Abstract

In this paper, we propose a novel weight initialization method for the Random Neural Network. The method relies on approximating the signal-flow equations of the network to obtain a linear system of equations with nonnegativity constraints. For the solution of the formulated linear Nonnegative Least Squares problem, we have developed an improved projected gradient algorithm. It is shown that supervised learning with the developed initialization method has better performance, both in terms of solution quality and execution time, than learning with random initialization when applied to a combinatorial optimization emergency response problem.
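To make the optimization setting concrete, the following is a minimal sketch of a basic projected gradient method for a Nonnegative Least Squares problem (minimize ||Ax − b||² subject to x ≥ 0). It is an illustrative baseline only, not the paper's improved algorithm; the matrix `A` and vector `b` are toy data, not the network's signal-flow system.

```python
import numpy as np

def projected_gradient_nnls(A, b, steps=500):
    """Minimize ||A x - b||^2 subject to x >= 0 via projected gradient.

    Illustrative baseline only; the paper develops an improved variant.
    """
    x = np.zeros(A.shape[1])
    # The Lipschitz constant of the gradient gives a safe fixed step size.
    L = np.linalg.norm(A, 2) ** 2
    for _ in range(steps):
        grad = A.T @ (A @ x - b)           # gradient of 0.5 * ||Ax - b||^2
        x = np.maximum(x - grad / L, 0.0)  # gradient step, then project onto x >= 0
    return x

# Tiny example: an overdetermined system whose least-squares
# solution happens to be nonnegative.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
x = projected_gradient_nnls(A, b)
```

Here the projection is simply elementwise clipping at zero, which is what makes the nonnegativity constraint cheap to enforce at every iteration.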


Related articles

IDIAP Technical report

Proper initialization is one of the most important prerequisites for fast convergence of feed-forward neural networks such as high-order and multilayer perceptrons. This publication aims at determining the optimal value of the initial weight variance (or range), which is the principal parameter of random weight initialization methods for both types of neural networks. An overview of random weight...
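The initial weight variance discussed above is typically set relative to the layer's fan-in. The sketch below shows one common such choice (variance 1/fan_in, keeping pre-activation variance roughly constant); the report compares several settings of this kind, and the exact scaling here is an assumption, not the report's result.

```python
import numpy as np

def init_weights(fan_in, fan_out, rng=np.random.default_rng(0)):
    # Scale the standard deviation by 1/sqrt(fan_in) so that the variance
    # of pre-activations stays near that of the inputs (one common choice).
    std = np.sqrt(1.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = init_weights(256, 128)  # sample standard deviation should be near 1/16
```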


Nonparametric Weight Initialization of Neural Networks via Integral Representation

A new initialization method for hidden parameters in a neural network is proposed. Derived from the integral representation of neural networks, a nonparametric probability distribution of hidden parameters is introduced. In this proposal, hidden parameters are initialized by samples drawn from this distribution, and output parameters are fitted by ordinary linear regression. Numerical experimen...


Algorithms for Initialization of Neural Network Weights

The paper is devoted to the comparison of different approaches to initialization of neural network weights. Most algorithms based on various levels of modification of random weight initialization are used for the multilayer artificial neural networks. Proposed methods were verified for simulated signals at first and then used for modelling of real data of gas consumption in the Czech Republic.


Kernel Reparametrization Trick

While deep neural networks have achieved state-of-the-art performance on many tasks across varied domains, they still remain black boxes whose inner workings are hard to interpret and understand. In this paper, we develop a novel method for efficiently capturing the behaviour of deep neural networks using kernels. In particular, we construct a hierarchy of increasingly complex kernels that enco...


Generalizing and Improving Weight Initialization

We propose a new weight initialization suited for arbitrary nonlinearities by generalizing previous weight initializations. The initialization corrects for the influence of dropout rates and an arbitrary nonlinearity’s influence on variance through simple corrective scalars. Consequently, this initialization does not require computing mini-batch statistics nor weight pre-initialization. This si...
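A corrective-scalar initialization of the kind described above can be sketched as follows. The specific scalars used here (the usual √2 gain for ReLU and a 1/√keep_prob factor for dropout) are illustrative assumptions, not the paper's exact derivation.

```python
import numpy as np

def corrected_init(fan_in, fan_out, keep_prob=0.8, nonlin_gain=np.sqrt(2.0),
                   rng=np.random.default_rng(0)):
    # Illustrative corrective scalars (not the paper's exact formula):
    # nonlin_gain compensates for the nonlinearity's effect on variance
    # (sqrt(2) is the usual ReLU gain); 1/sqrt(keep_prob) compensates for
    # the variance lost to dropout.
    std = nonlin_gain / np.sqrt(fan_in * keep_prob)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = corrected_init(512, 256)
```

Both corrections are closed-form, which is why no mini-batch statistics or pre-initialization pass is needed.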



Journal:
  • Neurocomputing

Volume 73, Issue

Pages -

Publication date: 2009